On Estimating L2 Divergence

Authors

  • Akshay Krishnamurthy
  • Kirthevasan Kandasamy
  • Larry Wasserman
Abstract

We give a comprehensive theoretical characterization of a nonparametric estimator for the L2 divergence between two continuous distributions. We first bound the rate of convergence of our estimator, showing that it is √n-consistent provided the densities are sufficiently smooth. In this smooth regime, we then show that our estimator is asymptotically normal, construct asymptotic confidence intervals, and establish a Berry-Esséen style inequality characterizing the rate of convergence to normality. We also show that this estimator is minimax optimal.
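To make the quantity concrete: the L2 divergence between densities p and q is ∫(p(x) − q(x))² dx. Below is a minimal plug-in sketch that estimates it in one dimension with Gaussian kernel density estimates and numerical integration. This is an illustration of the quantity being estimated, not the paper's estimator (which is a corrected estimator with the √n rate and normality guarantees described above); all function names here are our own.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import quad


def l2_divergence_plugin(x, y, bw=None):
    """Naive plug-in estimate of int (p(t) - q(t))^2 dt between the
    1-D densities generating samples x and y, via Gaussian KDEs."""
    p_hat = gaussian_kde(x, bw_method=bw)
    q_hat = gaussian_kde(y, bw_method=bw)
    # integrate over a range generously covering both samples
    lo = min(x.min(), y.min()) - 3.0
    hi = max(x.max(), y.max()) + 3.0
    val, _ = quad(lambda t: (p_hat(t)[0] - q_hat(t)[0]) ** 2, lo, hi)
    return val


rng = np.random.default_rng(0)
# matching distributions: divergence near 0
same = l2_divergence_plugin(rng.normal(0, 1, 500), rng.normal(0, 1, 500))
# well-separated distributions: clearly positive divergence
diff = l2_divergence_plugin(rng.normal(0, 1, 500), rng.normal(3, 1, 500))
```

Note that this naive plug-in estimator inherits the KDE's smoothing bias and converges more slowly than the paper's √n rate; it only illustrates the target functional.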


Related articles


Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning

Approximating a divergence between two probability distributions from their samples is a fundamental challenge in statistics, information theory, and machine learning. A divergence approximator can be used for various purposes such as two-sample homogeneity testing, change-point detection, and class-balance estimation. Furthermore, an approximator of a divergence between the joint distribution ...


On the Convergence Properties of Contrastive Divergence

Contrastive Divergence (CD) is a popular method for estimating the parameters of Markov Random Fields (MRFs) by efficiently approximating an intractable term in the gradient of the MRF’s log probability. Despite its empirical success, basic theoretical questions on its convergence properties are currently open. In this paper, we analyze the CD1 update rule for Restricted Boltzmann Machines (RBM...
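The CD1 update analyzed in that paper approximates the intractable negative-phase term of the log-likelihood gradient with a single Gibbs step. A minimal sketch of that update for a binary RBM is below; parameter names and shapes are our own choices for illustration, not taken from the paper.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def cd1_update(W, b, c, v0, lr=0.1, rng=None):
    """One CD-1 step for a binary RBM.

    W: (n_visible, n_hidden) weights; b: visible bias; c: hidden bias;
    v0: (batch, n_visible) binary data. Updates parameters in place
    and returns them."""
    if rng is None:
        rng = np.random.default_rng()
    # positive phase: hidden activations given the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # negative phase: a single Gibbs step back to visibles, then hiddens
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    n = v0.shape[0]
    # CD-1 gradient approximation: <v h>_data - <v h>_reconstruction
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

Because the reconstruction (v1, ph1) comes from only one Gibbs step rather than the model's stationary distribution, this update follows a biased estimate of the gradient, which is exactly what makes its convergence properties nontrivial to analyze.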


Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating a true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using an appropriate (in some sense; see below) model fθ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, say a model confidence set, for the unknown model h(·). Application of such confide...


Generalized L2-Divergence and Its Application to Shape Alignment

This paper proposes a novel and robust approach to the groupwise point-set registration problem in the presence of large amounts of noise and outliers. Each point set is represented by a mixture of Gaussians, and registration is treated as a problem of aligning the multiple mixtures. We develop a novel divergence measure which is defined between any arbitrary number of pr...



Publication date: 2015